Mutual Embeddings

Authors

  • Ilker Nadi Bozkurt
  • Hai Huang
  • Bruce M. Maggs
  • Andréa W. Richa
  • Maverick Woo
Abstract

This paper introduces a type of graph embedding called a mutual embedding. A mutual embedding between two n-node graphs G1 = (V1, E1) and G2 = (V2, E2) is an identification of the vertices of V1 and V2, i.e., a bijection π : V1 → V2, together with an embedding of G1 into G2 and an embedding of G2 into G1, where in the embedding of G1 into G2 each node u of G1 is mapped to π(u) in G2, and in the embedding of G2 into G1 each node v of G2 is mapped to π⁻¹(v) in G1. The identification of vertices in G1 and G2 constrains the two embeddings, so that it is not always possible for both to exhibit small congestion and dilation, even if there are traditional one-way embeddings in both directions with small congestion and dilation. Mutual embeddings arise in the context of finding preconditioners for accelerating the convergence of iterative methods for solving systems of linear equations. We present mutual embeddings between several types of graphs such as linear arrays, cycles, trees, and meshes, and prove lower bounds on mutual embeddings...
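To make the definition concrete, the following sketch (not from the paper; the shortest-path routing, the identity bijection, and the specific 6-node graphs are illustrative assumptions) measures dilation and congestion of both directions of a mutual embedding between a linear array and a cycle. Because both directions must use the same vertex identification, the cycle's wrap-around edge is forced onto a long path in the array:

```python
from collections import deque

def shortest_path(adj, s, t):
    """BFS shortest path from s to t in an undirected graph (adjacency dict)."""
    parent = {s: None}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:
            break
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                q.append(v)
    path = [t]
    while parent[path[-1]] is not None:
        path.append(parent[path[-1]])
    return path[::-1]

def dilation_congestion(guest_edges, host_adj, pi):
    """Embed each guest edge (u, v) as a shortest host path from pi(u) to pi(v);
    return (dilation, congestion) of that routing."""
    load = {}
    dilation = 0
    for u, v in guest_edges:
        p = shortest_path(host_adj, pi[u], pi[v])
        dilation = max(dilation, len(p) - 1)
        for a, b in zip(p, p[1:]):
            e = frozenset((a, b))
            load[e] = load.get(e, 0) + 1
    return dilation, max(load.values())

n = 6
array_edges = [(i, i + 1) for i in range(n - 1)]   # linear array (path graph)
cycle_edges = array_edges + [(n - 1, 0)]           # cycle

def to_adj(edges):
    adj = {i: [] for i in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    return adj

ident = {i: i for i in range(n)}  # one shared bijection for both directions
# Array -> cycle: every array edge is already a cycle edge.
print(dilation_congestion(array_edges, to_adj(cycle_edges), ident))  # (1, 1)
# Cycle -> array: the wrap-around edge stretches across the whole array.
print(dilation_congestion(cycle_edges, to_adj(array_edges), ident))  # (5, 2)
```

Under the identity identification, one direction is perfect while the other suffers dilation n − 1; the paper's contribution is bounds on how well a single bijection can serve both directions simultaneously.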


Similar references

Hashing with Mutual Information

Binary vector embeddings enable fast nearest neighbor retrieval in large databases of high-dimensional objects, and play an important role in many practical applications, such as image and video retrieval. We study the problem of learning binary vector embeddings under a supervised setting, also known as hashing. We propose a novel supervised hashing method based on optimizing an information-th...


Modular Formalization of Reactive Modules in COQ

We present modular formalizations of the model specification language Reactive Modules and the temporal logic CTL∗ in the proof assistant Coq. In our formalizations, both shallow and deep embeddings of each language are given. The modularity of our formalizations allows proofs and theorems to be reused across different embeddings. We illustrate the advantages of our modular formalizations by pr...


Swivel: Improving Embeddings by Noticing What's Missing

We present Submatrix-wise Vector Embedding Learner (Swivel), a method for generating lowdimensional feature embeddings from a feature co-occurrence matrix. Swivel performs approximate factorization of the point-wise mutual information matrix via stochastic gradient descent. It uses a piecewise loss with special handling for unobserved co-occurrences, and thus makes use of all the information in...


Random walks on discourse spaces: a new generative language model with applications to semantic word embeddings

Semantic word embeddings represent the meaning of a word via a vector, and are created by diverse methods such as Latent Semantic Analysis (LSA), generative text models such as topic models, matrix factorization, neural nets, and energy-based models. Many methods use nonlinear operations —such as Pairwise Mutual Information or PMI— on co-occurrence statistics, and have hand-tuned hyperparameter...


word2vec Skip-Gram with Negative Sampling is a Weighted Logistic PCA

Mikolov et al. (2013) introduced the skip-gram formulation for neural word embeddings, wherein one tries to predict the context of a given word. Their negative-sampling algorithm improved the computational feasibility of training the embeddings. Due to its state-of-the-art performance on a number of tasks, there has been much research aimed at better understanding it. Goldberg and Levy (2014)...



Journal:
  • Journal of Interconnection Networks

Volume 15, Issue —

Pages —

Published 2015